7,720 results found (search time: 421 ms)
1.
With the development of information technology, the digital economy has become a "new engine" of economic growth. However, in the absence of an authoritative industrial statistical classification standard, scholars have long faced the awkward situation of "digital economy research lacking digital evidence." Based on the classification standard in the Statistical Classification of the Digital Economy and Its Core Industries (2021), issued and implemented by the National Bureau of Statistics, this paper reorganizes data from provincial statistical yearbooks, constructs a digital economy development index using the entropy weight method, measures the digital economy development level of 30 Chinese provinces, and analyzes interprovincial differences and their spatio-temporal characteristics. The study finds that China's digital economy developed rapidly over 2009–2019, with every sub-industry making substantial progress; by comparison, the digital-factor-driven sub-industry grew somewhat more slowly than the other three. Digital economy development is also markedly unbalanced across regions: the eastern and central regions clearly outperform the western region, the south outperforms the north, and the regional imbalance shows a continuing tendency to widen.
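The entropy weight method used to build the index is simple enough to sketch. A minimal Python version, assuming non-negative, positively oriented indicators (the function names and the min-max normalization step are illustrative choices, not taken from the paper):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method: rows are provinces, columns are indicators.

    Indicators are assumed non-negative and positively oriented.
    """
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)               # column-wise proportions p_ij
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)   # treat 0*log(0) as 0
    e = -(P * logP).sum(axis=0) / np.log(n)      # entropy of each indicator
    d = 1.0 - e                                  # degree of divergence
    return d / d.sum()

def development_index(X):
    """Weighted sum of min-max-normalized indicators."""
    X = np.asarray(X, dtype=float)
    w = entropy_weights(X)
    rng = X.max(axis=0) - X.min(axis=0)
    rng[rng == 0] = 1.0                 # constant indicators contribute nothing
    Xn = (X - X.min(axis=0)) / rng
    return Xn @ w
```

Indicators whose values vary more across provinces carry more information (lower entropy) and therefore receive larger weights; a constant indicator gets weight zero.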
2.
This paper proposes a new hysteretic vector autoregressive (HVAR) model in which regime switching may be delayed while the hysteresis variable lies in a hysteresis zone. We adopt an adapted multivariate Student-t distribution, obtained by amending the scale-mixtures-of-normals representation, which allows greater flexibility: each component time series can have its own degrees of freedom. We use the proposed model to test for a causal relationship between any two target time series and, using posterior odds ratios, overcome the limitations of the classical approach to multiple testing. Both simulated and real examples illustrate the suggested methods. We apply the proposed HVAR model to investigate the causal relationship between the quarterly growth rates of gross domestic product of the United Kingdom and the United States, and we check the pairwise lagged dependence of daily PM2.5 levels in three districts of Taipei.
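The distinctive ingredient of the hysteretic model is the delayed switching rule: inside the hysteresis zone the previous regime is simply carried forward. A toy sketch of that rule in isolation (thresholds and names are illustrative; the full HVAR adds the VAR dynamics and the multivariate Student-t errors):

```python
def hysteresis_regimes(z, lower, upper, init=0):
    """Regime path under a hysteretic switching rule.

    Regime 1 when the hysteresis variable z exceeds `upper`, regime 0
    when it falls below `lower`; inside the hysteresis zone
    [lower, upper] the previous regime is retained, which is the
    "delayed switching" of the HVAR model.
    """
    r, out = init, []
    for zt in z:
        if zt > upper:
            r = 1
        elif zt < lower:
            r = 0
        # else: zt is inside the zone -> keep the current regime
        out.append(r)
    return out
```

Contrast with a threshold model, which would switch back to regime 0 the moment z drops below `upper`; here the switch waits until z exits the zone on the other side.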
3.
Proportional hazards are a common assumption when designing confirmatory clinical trials in oncology. This assumption affects not only the analysis but also the sample size calculation. Delayed effects cause the hazard ratio to change while the trial is ongoing: no difference between treatment arms is observed at first, and differences only start to appear after some unknown time point. The proportional hazards assumption then no longer holds, and both the sample size calculation and the analysis methods should be reconsidered. The weighted log-rank test allows weighting of early, middle, and late differences through the Fleming and Harrington class of weights and is more efficient when the proportional hazards assumption fails. The Fleming and Harrington weights, together with the estimated delay, can be incorporated into the sample size calculation in order to maintain the desired power once treatment-arm differences start to appear. In this article, we explore the impact of delayed effects in group sequential and adaptive group sequential designs and make an empirical evaluation of the power and type-I error rate of the weighted log-rank test in a simulated scenario with fixed values of the Fleming and Harrington weights. We also give practical recommendations on which methodology to use in the presence of delayed effects, depending on certain characteristics of the trial.
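The Fleming and Harrington weights have the closed form w(t) = S(t-)^ρ (1 − S(t-))^γ, where S is the pooled Kaplan–Meier estimate; taking ρ = 0 and γ > 0 up-weights late differences, which is what a delayed effect calls for. A minimal sketch of the weight computation (function name and interface are illustrative):

```python
import numpy as np

def fh_weights(times, events, rho=0.0, gamma=1.0):
    """Fleming–Harrington weights S(t-)^rho * (1 - S(t-))^gamma at the
    distinct event times, using the pooled Kaplan–Meier estimate S.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=bool)
    uniq = np.unique(times[events])     # sorted distinct event times
    s, w = 1.0, []
    for t in uniq:
        at_risk = np.sum(times >= t)
        d = np.sum((times == t) & events)
        s_left = s                      # S(t-): KM value just before t
        w.append(s_left**rho * (1.0 - s_left)**gamma)
        s *= 1.0 - d / at_risk          # KM step at t
    return uniq, np.array(w)
```

With ρ = 0, γ = 1 the earliest event gets weight 0 and weights grow as the survival curve drops, so early (pre-separation) events contribute little to the test statistic.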
4.
Assessment of analytical similarity of tier 1 quality attributes is based on a set of hypotheses that tests the mean difference between reference and test products against a margin adjusted for the standard deviation of the reference product. Proper assessment of the biosimilarity hypothesis therefore requires statistical tests that account for the uncertainty in estimating both the mean difference and the reference standard deviation. Recently, a linear reformulation of the biosimilarity hypothesis has been proposed, which facilitates the development and implementation of statistical tests; these tests account for the uncertainty in estimating all of the unknown parameters. In this paper, we survey methods for constructing confidence intervals for testing the linearized reformulation of the biosimilarity hypothesis and compare their performance. We discuss test procedures based on confidence intervals to enable comparison among recently developed methods as well as earlier methods that have not previously been applied to demonstrating analytical similarity. A computer simulation study was conducted to compare the methods in terms of size control, power, and computational complexity. We demonstrate the methods using two example applications and close with recommendations concerning their use.
5.
6.
In 2015, the United Nations (UN) issued probabilistic population projections for all countries up to 2100, by simulating future levels of total fertility and life expectancy and combining the results using a standard cohort component projection method. For the 40 countries with generalized HIV/AIDS epidemics, the mortality projections used the Spectrum/Estimation and Projection Package (EPP) model, a complex, multistate model designed for short-term projections of policy-relevant quantities for the epidemic. We propose a simpler approach that is more compatible with existing UN projection methods for other countries. Changes in life expectancy are projected probabilistically using a simple time series regression and then converted to age- and sex-specific mortality rates using model life tables designed for countries with HIV/AIDS epidemics. These are then input to the cohort component method, as for other countries. The method performed well in an out-of-sample cross-validation experiment. It gives similar short-run projections to Spectrum/EPP, while being simpler and avoiding multistate modelling.
7.
To compare multiple multi-attribute (multi-indicator) alternatives both at individual time points and over an entire period, this paper proposes a dynamic evaluation method that combines the ideal-solution (TOPSIS) approach with grey relational analysis, drawing on the strengths and avoiding the weaknesses of each. Working on three-dimensional data, the method fuses Euclidean distance and grey relational degree into a new closeness measure that reflects both positional differences and similarity in the shape of the data curves, thereby accounting for both the magnitude and the growth of indicator values. Finally, the method is applied to evaluating the ecological benefits of provincial circular economies during the 12th Five-Year Plan period, and the example verifies its practical effectiveness.
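A static, cross-sectional sketch of the closeness idea: combine TOPSIS-style Euclidean distances to the ideal and anti-ideal solutions with Deng-style grey relational degrees. The fusion formula, the `alpha` balance parameter, and all names below are illustrative assumptions; the paper's version additionally operates on three-dimensional (alternative × indicator × time) data:

```python
import numpy as np

def grey_degrees(X, ref, rho=0.5):
    """Deng-style grey relational degree of each row of X to row `ref`."""
    D = np.abs(X - ref)
    dmin, dmax = D.min(), D.max()       # global extrema, as in classic GRA
    coef = (dmin + rho * dmax) / (D + rho * dmax)
    return coef.mean(axis=1)

def combined_closeness(X, alpha=0.5):
    """Closeness mixing TOPSIS distances with grey relational degrees.

    Rows are alternatives; columns are normalized, positively oriented
    indicators. `alpha` balances positional difference (distance)
    against curve-shape similarity (grey relation).
    """
    pos, neg = X.max(axis=0), X.min(axis=0)   # ideal / anti-ideal rows
    dp = np.linalg.norm(X - pos, axis=1)      # distance to ideal
    dn = np.linalg.norm(X - neg, axis=1)      # distance to anti-ideal
    gp = grey_degrees(X, pos)                 # relation to ideal
    gn = grey_degrees(X, neg)                 # relation to anti-ideal
    # Larger dn and gp favor the ideal; larger dp and gn favor the anti-ideal
    Sp = alpha * dn / dn.max() + (1 - alpha) * gp / gp.max()
    Sn = alpha * dp / dp.max() + (1 - alpha) * gn / gn.max()
    return Sp / (Sp + Sn)
```

Distance alone can rank two alternatives as tied even when their indicator profiles have very different shapes; blending in the grey relational degree breaks such ties in favor of profiles that track the ideal curve.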
8.
In this paper, the second-order asymptotics of the tail probabilities of randomly weighted sums and their maxima are established in the case where the underlying primary random variables are subexponential. No assumptions are made on the dependence structure among the random weights, which are only required to be bounded away from zero and infinity.
9.
We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but allows for multilevel techniques as well. The IS approach provides a natural alternative to delayed acceptance (DA) pseudo-marginal/particle MCMC and has many advantages over DA, including straightforward parallelization and additional flexibility in the MCMC implementation. We detail minimal conditions that ensure strong consistency of the suggested estimators and provide central limit theorems with expressions for the asymptotic variances. We demonstrate how our method can make use of SMC in the state-space model context, using Laplace approximations and time-discretized diffusions. Our experimental results are promising and show that the IS-type approach can provide substantial gains relative to an analogous DA scheme, and is often competitive even without parallelization.
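The IS-type weighting can be illustrated in a stripped-down univariate form: run random-walk Metropolis on a cheap approximate target, then correct the draws with self-normalized importance weights. The Student-t-style target, normal approximation, step size, and sample size below are all illustrative assumptions, not the paper's latent-variable/SMC setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density of the target (Student-t with 4 d.f.)."""
    return -2.5 * np.log1p(x**2 / 4.0)

def log_approx(x):
    """Cheap approximate target: standard normal (unnormalized)."""
    return -0.5 * x**2

def rw_metropolis(logp, n, step=1.0):
    """Random-walk Metropolis chain targeting exp(logp)."""
    x, out = 0.0, np.empty(n)
    for i in range(n):
        prop = x + step * rng.standard_normal()
        if np.log(rng.random()) < logp(prop) - logp(x):
            x = prop
        out[i] = x
    return out

def is_estimate(f, n=20000):
    """Self-normalized IS estimate of E_target[f] from the approximate chain."""
    xs = rw_metropolis(log_approx, n)
    logw = log_target(xs) - log_approx(xs)   # correction weights (log scale)
    w = np.exp(logw - logw.max())            # stabilized before exponentiating
    return np.sum(w * f(xs)) / np.sum(w)
```

Unlike delayed acceptance, the correction is a post-processing step: the chain can be run (and parallelized) first, with the weights computed afterwards.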
10.
We investigate an optimal investment problem for participating insurance contracts with mortality risk under a minimum guarantee. The insurer aims to maximize the expected utility of the terminal payoff. Due to its piecewise payoff structure, this is a non-concave utility maximization problem. We adopt a concavification technique and a Lagrange dual method to solve the problem and derive representations of the optimal wealth process and trading strategies. We also carry out numerical analysis to show how the portfolio insurance constraint impacts the optimal terminal wealth.
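The concavification-and-dual route follows a standard pattern for non-concave utility maximization; a generic complete-market sketch, with notation that is illustrative rather than the paper's:

```latex
\max_{X_T \ge G}\ \mathbb{E}\,U(X_T)
\quad\text{s.t.}\quad \mathbb{E}[\xi_T X_T] \le x_0,
\qquad
X_T^{*} \;=\; \operatorname*{arg\,max}_{x \ge G}\ \bigl\{\, U^{**}(x) - y\,\xi_T\, x \,\bigr\},
```

where $G$ is the minimum guarantee, $\xi_T$ the state-price density, $x_0$ the initial wealth, and $U^{**}$ the concave envelope of the piecewise (non-concave) utility $U$. Replacing $U$ by $U^{**}$ makes the Lagrangian maximization pointwise and tractable; the multiplier $y > 0$ is chosen so the budget constraint binds, and under standard conditions the concavified problem attains the same optimal value as the original.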

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号